Market Roundup

April 25, 2003


Doing What You Gotta Do: Microsoft Releases Windows Server 2003

IBM Launches “Deep Computing” Initiative

Who’s Next?

SNIA-E Releases European Storage Survey

Hot Fix? Not So Fast!

WebMethods and Informatica Announce Business Activity Platform Availability


Doing What You Gotta Do: Microsoft Releases Windows Server 2003

By Clay Ryder

Microsoft has announced the general availability of Windows Server 2003, Visual Studio .NET 2003, and SQL Server 2000 Enterprise Edition (64-bit). The company indicated that this trio represents the cornerstone of its enterprise strategy and lays the foundation for customers to implement integrated, cost-effective solutions that connect information, people, systems, and devices. “Doing more with less” was the mantra of the day as the company demonstrated several new capabilities of WS 2003 targeted at providing an interoperable enterprise infrastructure at higher performance and lower acquisition and IT operational cost (in part due to a new focus on self-service IT) than its predecessor. Microsoft indicated that WS 2003 operates 30% more efficiently than Windows NT 4.0 and is the first Windows Server release to support 64-bit Itanium systems. On the strength of that 64-bit support, the company also claims higher performance on the tpmC benchmark than any non-clustered RISC-based UNIX solution. The WS 2003 family includes the following configurations: WS 2003 Datacenter Edition (32-bit and 64-bit), WS 2003 Enterprise Edition (32-bit and 64-bit), WS 2003 Standard Edition, and WS 2003 Web Edition – all of which are now available – plus Windows Small Business Server 2003, which will be available in Q3 2003.

Given the economic and IT doldrums that seem as difficult to escape as the proverbial death and taxes, the launch does raise the question of whether this is an opportune time for a major new release of Microsoft’s flagship server operating system. Although WS 2003 offers several new capabilities, it is noteworthy that a substantial portion of the product’s focus is on enabling a more self-service approach to corporate computing. Support for multiple versions of server-based documents (a ho-hum feat to those who used VMS, but a big deal for Windows users) not only saves the user who accidentally trashed a critical file, but more importantly eliminates frantic calls to the IT help desk. In addition, the enhancements to SharePoint Services take some of the mundane tasks IT faces when new project teams are established – creating new directories and file partitions, granting access to the appropriate individuals, and so on – and transfer this activity to the users themselves. Relieving IT of these burdensome tasks reflects the doing-more-with-less theme, as Microsoft seeks to automate, or create self-service methods for, the fulfillment of mundane but essential IT processes.

It is also clear that WS 2003 is an operating system that expects n-tier environments to become the operating norm in the enterprise. Features such as remote access to a user’s corporate desktop illustrate that the server is not merely a discrete resource but can become a conduit for enterprise computing, one that either facilitates information exchange between machines or stands out of the way as warranted. Given Microsoft’s continuing push for adoption of .NET-based applications and services, it is clear that the company is positioning itself as a provider of applied infrastructure products rather than a vendor of point products, be they client/server operating systems or desktop applications. As standards-based network services such as Web Services, XML engines, and other middleware increasingly become the focal points of application development – making J2EE and .NET the effective operating environments for enterprise applications – we are witnessing another layer of abstraction in both the server itself and the resources under its control. The nuts and bolts, or more appropriately the hardware, firmware, and base-level software underpinnings of the enterprise IT infrastructure, are no longer the design criteria for application development; they give way to n-tier, network-focused environments. All this aside, WS 2003 represents a continuing evolution of the server operating system, at least in Microsoft’s eyes. If history offers any indication of future behavior, we expect WS 2003 to stand as another step in the Redmond Giant’s quest to advance its agenda and crown itself king of the enterprise, as it has done in the consumer PC market.
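
To make that abstraction concrete, consider a minimal sketch (in present-day Python rather than 2003-vintage tooling, with a hypothetical service endpoint and document format of our own invention) of what a standards-based service call looks like from the application’s point of view: the client asks a network-facing service for XML over HTTP and never touches the hardware, firmware, or base-level software beneath it.

import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical endpoint and document layout, for illustration only;
# this is not any vendor's actual .NET or J2EE API.
SERVICE_URL = "http://erp.example.com/services/orders?customer=1042"

def fetch_open_orders(url: str) -> list[dict]:
    """Ask the (hypothetical) order service for XML and return the open orders."""
    with urllib.request.urlopen(url, timeout=10) as response:
        root = ET.parse(response).getroot()
    return [
        {"id": order.get("id"), "total": float(order.findtext("total", "0"))}
        for order in root.findall("order")
        if order.get("status") == "open"
    ]

if __name__ == "__main__":
    for order in fetch_open_orders(SERVICE_URL):
        print(f"Order {order['id']}: {order['total']:.2f}")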

 

IBM Launches “Deep Computing” Initiative

By Charles King

IBM has announced the creation of its “Deep Computing” strategic initiative, which will leverage IBM hardware, software, emerging technologies, research initiatives, and industry expertise to benefit high performance computing (HPC) customers. The company is appointing a dedicated Deep Computing team that will take a comprehensive approach to IBM’s HPC efforts, blending existing hardware and software with emerging open standards such as Grid protocols into high-performance solutions. Dave Turek was named vice president of the Deep Computing group, and Peter Ungaro will direct Deep Computing sales.

We do not usually pay a great deal of attention to the formation of new business groups, but a number of issues surrounding IBM’s Deep Computing are well worth considering. First, IBM’s history and strength in supercomputing and the HPC space make the formation of a dedicated team a natural step for the company to take. We assume the team’s Deep moniker refers to the chess-playing Deep Blue system that beat grandmaster Garry Kasparov. In fact, the technologies developed for Deep Blue were adapted for the IBM eServer p655, an 8-way POWER4-based system developed specifically for clustered HPC applications. The continuing migration of clustering technologies into the once monolithic world of supercomputing is the second piece of the Deep Computing puzzle we find particularly interesting. The time when high-performance computing meant Big Iron is rapidly fading, replaced by highly clustered configurations of scores or even hundreds of RISC and commodity servers. The penetration of such systems can be seen clearly on the Top500.org Web site, which ranks supercomputers worldwide.

So what does all this have to do with IBM’s Deep Computing initiative? While the sheer cost and complexity of HPC systems tend to confine them primarily to research and government lab settings, continuing declines in hardware prices along with enhancements in performance are making commercial applications of HPC increasingly viable and enticing. IBM has found notable success with commercial HPC solutions for the automotive, pharmaceutical, and oil/gas exploration industries, and we expect those applications and sales wins to continue as hardware of every kind becomes increasingly affordable. Hence the formation of Deep Computing: to continue developing cutting-edge solutions for traditional HPC customers, and to drive commercial applications of those same technologies. Overall, we see IBM’s Deep Computing initiative both as a simple continuation of one of the company’s traditional strong points and as a concerted effort to deliver what were once considered highly esoteric computing technologies into broader markets.

 

Who’s Next?

By Jim Balderston

Siebel Systems and PeopleSoft – two of the market’s largest CRM vendors – released their earnings reports this week. Siebel reported revenues of $332.8 million and net income of $4.6 million for the quarter. Revenues came largely from maintenance, consulting, and services, at $220.7 million, with license fees bringing in $112.1 million. Siebel’s profits were off 93% from the first quarter of last year. PeopleSoft announced revenues of $460.3 million and a profit of $38 million; both figures were off from a year ago. Its license fees fell 39% compared to a year ago, dropping to $80.8 million, and the company announced it was cutting 200 jobs.

We have not been shy about our criticisms of CRM vendors and with these results in hand there is no reason to change course. While many believe that all that is needed is a turnaround in the economy for CRM vendors to once again be making financial waves in the marketplace, we see signs that even a robust economy would not necessarily give these folks the kind of boost that they enjoyed in the go-go days of CRM.

One would think that in a down market, CRM – or customer care and feeding applications – would be in great demand as companies try to hold onto the customers they have and perhaps increase sales to those customers without the high-risk move of adding staff. Yet these numbers show that hypothesis is not being realized. Instead, they indicate that despite the need to hold onto customers in a down economy, CRM is not the answer for many businesses. In fact, it would appear that CRM spending was – and may always be – more of a discretionary outlay, one on a par with advertising. And that the market for such discretionary spending – as the dozens of failed tech publications can attest – is very fickle indeed. While we believe that offering applications and technology that help companies keep their customers close at hand is an intriguing opportunity and a potentially valuable offering, we are not sure that the last wave of CRM vendors has laid the groundwork for the next generation of customer care applications. When one considers the high cost and complexity of CRM offerings – as documented by the revenue figures for Siebel Systems – one can’t help but think that customer care was not the prime motivation of product development at Siebel. It would appear that long-term customer commitments and investment were the sought-after prime directive. As a result, the idea of helping companies weather the down times by holding onto their existing customer base may have fallen by the wayside. We suspect customer care and feeding applications may well have to come from a different direction, from different vendors who might actually deliver on their promises. Any takers?

 

SNIA-E Releases European Storage Survey

By Charles King

A market research report sponsored by the Storage Networking Industry Association Europe (SNIA-E) suggests that enterprise storage requirements are continuing to expand notably in the European market. Omarketing Limited conducted the study, which queried 100 respondents representing storage professionals from end-user organizations throughout Europe. Responses were collected from September 2002 through January 2003 at a variety of European events and via the SNIA-E Web site. According to the report’s executive summary, 30% of respondents claimed that storage requirements had grown by 100% or more in the previous twelve months, and 9% reported storage growth exceeding 200%. Additionally, the group as a whole expects storage growth to continue at approximately the same rate over the coming year. The key applications respondents cited as driving storage growth were databases (73%) and email (63%); eCommerce (22%) and CRM (28%) represented the smallest growth categories, trailing images, videos, and MP3s (30%). Nearly four in ten (38%) survey participants said TCO is the key driver in making storage purchasing decisions, though slightly smaller numbers also cited storage management, downtime, integration, and scalability as critical issues. Nearly a third of respondents expressed interest in deploying virtualization solutions during the next six months.

The SNIA-sponsored study offers both items of interest and cautionary notes regarding the value of market research. To begin, we were troubled by how some data points were presented in the executive summary. For example, the study suggests that 86% of respondents claimed their storage needs increased by more than 50%, while 14% saw no change or a decrease, leading to the claim that storage needs are outstripping Moore’s Law. What we wondered was why no respondents reported (or perhaps were even given the option to report) increases between 1% and 50%. A small thing, perhaps, but one that niggles a bit. We were also disturbed by the study’s suggestion that the respondents’ interest in virtualization indicated both the timeliness of SNIA’s Bluefin-based Storage Management Initiative and a maturing of the market for virtualization. While users appear interested in virtualization and similar centralized management solutions for heterogeneous storage environments, we believe the SNIA’s conclusion has at least the appearance of being a bit high-handed, especially considering the still-nascent state of virtualization technologies. We were also interested to note that while virtualization was fourth on the list of storage areas of interest for the next twelve months, the three technologies preceding it were not mentioned at all in the summary.

Despite those shortcomings, the study did offer some food for thought. Users’ ongoing interest in disaster recovery and data backup was particularly intriguing when contrasted with their essentially lackadaisical attitudes toward testing the systems and procedures involved in these areas. Some may regard this as a simple sign of incompetence, but it may instead indicate the level of difficulty end users have in establishing and maintaining effective disaster recovery programs. If that is the case, it could point to opportunities for storage service providers and storage vendors focused on delivering utility-style storage services. We were also intrigued by the 38% of users whose organizations deploy storage environments that blend various mixes of DAS, SAN, and NAS. Clearly, European enterprises have the same tendencies as their stateside counterparts in deploying storage solutions however and wherever they please. This tendency seems to us to be driving the interest in virtualization-style solutions, whether they come from the SNIA or elsewhere. Overall, while we found the SNIA-E’s storage study of some interest, we believe a more balanced analysis and presentation of the data collected would have brought more credibility to the group and its members.

 

Hot Fix? Not So Fast!

By Jim Balderston

Microsoft released its latest security patch for Windows XP on April 16, and a number of users reported that the patch, which Microsoft rated at its second-highest priority level of “important,” degraded computer performance. Microsoft has said it is working to resolve the issues that some users reported on a number of user-group online discussion boards and is investigating the problem. The security flaw the patch was meant to address could allow an attacker to raise their privilege level on a susceptible system.

There may be a parallel between Microsoft’s “important” rating for this patch and the nation’s color-coded security alert system. One has to wonder to what degree Microsoft’s rating system reflects actual threats, and to what degree it is designed to fend off the hordes of anti-Microsoft techies who regularly – and in many cases, accurately – describe or reveal Microsoft operating system vulnerabilities.

Yet this incident raises a series of interesting questions concerning the whole concept of software as a subscription, one that is constantly updated, tweaked, and modified as new developments demand. In such an environment, does the code base need to be fundamentally re-engineered in a way that will prevent patches from causing havoc? Since we are still in the early stages of this software model, one can say with reasonable certainty that not all the potential bugs in an ever-evolving code base have been worked out. While we believe the software-as-subscription model is here to stay, we suspect that its desirable properties – i.e., always up-to-date code – will require a little more rework on the back end as a result of conflicts like this one. Over time, the modularity of the code base at the core of a subscription model will mature, but doing so will require a substantial effort toward increased stability and “healability,” combined with the need to maintain some level of backwards compatibility. Incremental updates remain an improvement over wholesale OS upgrades or new releases, but we suspect that the real work in perfecting the software-as-subscription model lies ahead of us, and as such remains a formidable challenge for any and all who undertake it.
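
For illustration, the kind of “healability” we have in mind can be sketched in a few lines of Python: an incremental update that backs up whatever it replaces and quietly puts the old version back if a post-install check fails. The file layout and health check below are hypothetical; the sketch shows the shape of the mechanism, not any vendor’s actual patch engine.

import shutil
from pathlib import Path

def apply_patch(targets: dict[Path, Path], health_check) -> bool:
    """Copy each replacement file over its target; roll everything back on failure.

    targets maps destination Path -> replacement Path; health_check is any
    callable returning True when the system still works after the update.
    Both are hypothetical stand-ins for a real patch manifest and smoke test.
    """
    backups = {}
    try:
        for destination, replacement in targets.items():
            if destination.exists():
                backup = destination.with_suffix(destination.suffix + ".bak")
                shutil.copy2(destination, backup)
                backups[destination] = backup
            shutil.copy2(replacement, destination)
        if not health_check():
            raise RuntimeError("post-patch health check failed")
        return True
    except Exception:
        for destination, backup in backups.items():
            shutil.copy2(backup, destination)  # heal: restore the old version
        return False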

 

WebMethods and Informatica Announce Business Activity Platform Availability

By Myles Suer

WebMethods and Informatica this week announced the release of their jointly developed Business Activity Platform, which combines WebMethods’ Integration Platform with Informatica's PowerCenter data integration and PowerAnalyzer business intelligence software. The companies claim their integrated offering creates real-time visibility in a business intelligence dashboard, and say the new solution extends the typical limits of business intelligence software by enabling real-time data and analysis and by interpreting data to formulate and invoke optimal responses. Both HP and Accenture have agreed to be systems integrators for the Business Activity Platform. The product is now available; pricing was not announced.

WebMethods and Informatica have performed an interesting balancing act with their new Business Activity Platform, first asserting that business intelligence cannot handle the range of problems the new system can, while also claiming that the new system offers users additional value by correlating real-time data against historical information. Since companies including Brio are making the same argument, Business Activity Monitoring has clearly become the latest cool enterprise software offering. But as with any new technology, product definitions provide the basis of much of the attendant marketing hype. WebMethods and Informatica have joined the party by asserting that they can monitor any business process – for example, by layering on top of data center management software such as BMC Patrol and HP OpenView.
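
To give a sense of what correlating real-time data against historical information actually involves, here is a minimal sketch in Python, with entirely hypothetical metrics and thresholds: compute a baseline from history, then flag live readings that fall outside it. What the vendors are really selling, of course, is everything around this core: process context, event routing, and workflow.

from statistics import mean, stdev

def outside_baseline(history: list, live_value: float, sigmas: float = 3.0) -> bool:
    """Return True when a live reading falls outside the historical norm."""
    baseline, spread = mean(history), stdev(history)
    return abs(live_value - baseline) > sigmas * spread

if __name__ == "__main__":
    hourly_orders = [112, 108, 121, 117, 109, 115, 119, 111]  # hypothetical history
    if outside_baseline(hourly_orders, live_value=42):
        print("Dashboard alert: order volume far below its historical baseline")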

We live in an age when software must demonstrate real ROI; to create such value, systems need to take on unique tasks or provide demonstrable solutions for specific problems. To our way of thinking, bridging analytics with business planning and budgeting offers a unique value proposition. For this reason, we believe Cognos and Hyperion have taken the right tack by focusing on the use of business activity monitoring to create indicators of a company’s business health. Although a personal dashboard such as that provided in the Business Activity Platform can be a fun technology, having one that can monitor every enterprise business process does not create unique, measurable ROI. In the end, simply acquiring real-time data is of limited usefulness without a larger context.


The Sageza Group, Inc.

836 W El Camino Real

Mountain View, CA 94040-2512

650·390·0700     fax 650·649·2302

London +44 (0) 20·7900·2819

Munich +49 (0) 89·4201·7144

 

sageza.com

 

Copyright © 2003 The Sageza Group, Inc.

May not be duplicated or retransmitted without written permission